Limit complexities revisited [once more]
The main goal of this article is to put some known results in a common
perspective and to simplify their proofs.
We start with a simple proof of a result of Vereshchagin saying that
\limsup_n\KS(x|n) equals \KS^{\mathbf{0'}}(x). Then we use the same argument to prove
similar results for prefix complexity and a priori probability on the binary tree, to
prove Conidis' theorem about limits of effectively open sets, and also to
improve the results of Muchnik about limit frequencies. As a by-product, we get
a criterion of 2-randomness proved by Miller: a sequence X is 2-random if and
only if there exists c such that any prefix x of X is a prefix of some
string y such that \KS(y)\ge |y|-c. (In the 1960s this property was
suggested by Kolmogorov as one of possible randomness definitions.) We also get
another 2-randomness criterion by Miller and Nies: X is 2-random if and only
if \KS(x)\ge |x|-c for some c and infinitely many prefixes x of X.
This is a modified version of our old paper that contained a weaker (and
cumbersome) version of Conidis' result, and the proof used the low basis theorem
(in quite a strange way). The full version was formulated there as a
conjecture. This conjecture was later proved by Conidis. Bruno Bauwens
(personal communication) noted that the proof can be obtained also by a simple
modification of our original argument, and we reproduce Bauwens' argument with
his permission. Comment: See http://arxiv.org/abs/0802.2833 for the old paper.
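Kolmogorov complexity is uncomputable, but any compressor gives a computable upper bound on it. As a purely illustrative sketch of the Miller–Nies-style criterion (the slack constant c and the use of zlib are assumptions of this sketch, not part of the paper), one can test which prefixes of a random-looking sequence satisfy comp_len(x) >= |x| - c:

```python
import os
import zlib

def compressed_len_bits(bits: str) -> int:
    """Computable upper-bound proxy for plain complexity C(x):
    bit length of the zlib-compressed packed bit string."""
    if not bits:
        return 8 * len(zlib.compress(b"", 9))
    packed = int(bits, 2).to_bytes((len(bits) + 7) // 8, "big")
    return 8 * len(zlib.compress(packed, 9))

# A random-looking sequence from the OS entropy source stands in for X.
X = "".join(f"{b:08b}" for b in os.urandom(64))  # 512 bits

c = 100  # illustrative slack constant
good = [n for n in range(8, len(X) + 1, 8)
        if compressed_len_bits(X[:n]) >= n - c]
print(f"{len(good)} of {len(X) // 8} sampled prefixes satisfy "
      f"comp_len(x) >= |x| - {c}")
```

For a genuinely incompressible sequence every prefix passes the test, while a highly regular sequence (e.g. all zeros) fails it for long prefixes; of course, the real criterion uses C itself, which no compressor can reach.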
Effective bounds for convergence, descriptive complexity, and natural examples of simple and hypersimple sets
Abstract: Let μ be a universal lower enumerable semi-measure (defined by L. Levin). Any computable upper bound for μ can be effectively separated from zero with a constant (this is similar to a theorem of G. Marandzhyan). Computable positive lower bounds for μ can be nontrivial and allow one to construct natural examples of hypersimple sets (introduced by E. Post).
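For reference, a (discrete) semimeasure assigns every binary string a nonnegative weight with total mass at most 1. The toy function below is an illustrative assumption of this sketch, not Levin's universal μ (which is only lower enumerable, not computable); it merely shows what a computable positive lower-bound candidate looks like and checks the mass bound exactly:

```python
from fractions import Fraction
from itertools import product

def m(x: str) -> Fraction:
    """Toy computable positive function on binary strings:
    m(x) = 2^(-2|x|-1); summed over all strings it equals 1,
    so it is a semimeasure (here, in fact, a measure)."""
    return Fraction(1, 2 ** (2 * len(x) + 1))

# Partial sums over all strings of length < 8 must stay below 1.
total = Fraction(0)
for n in range(8):
    for bits in product("01", repeat=n):
        total += m("".join(bits))
print(total, total <= 1)  # 255/256 True
```

Each length n contributes 2^n · 2^(-2n-1) = 2^(-n-1) to the sum, so the partial sums converge to 1 from below.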
Limit complexities revisited
The main goal of this paper is to put some known results in a common
perspective and to simplify their proofs. We start with a simple proof of a
result from (Vereshchagin, 2002) saying that \limsup_n\KS(x|n) (here
\KS(x|n) is conditional (plain) Kolmogorov complexity of x when n is
known) equals \KS^{\mathbf{0'}}(x), the plain Kolmogorov complexity with
\mathbf{0'}-oracle. Then we use the same argument to prove similar results for
prefix complexity (and also improve results of (Muchnik, 1987) about limit
frequencies), a priori probability on the binary tree and measure of effectively
open sets. As a by-product, we get a criterion of Martin-L\"of
randomness relative to \mathbf{0'} (also called 2-randomness) proved in (Miller, 2004): a sequence
X is 2-random if and only if there exists c such that any prefix x
of X is a prefix of some string y such that \KS(y)\ge |y|-c. (In the
1960s this property was suggested in (Kolmogorov, 1968) as one of possible
randomness definitions; its equivalence to 2-randomness was shown in (Miller,
2004) while proving another 2-randomness criterion (see also (Nies et al.,
2005)): X is 2-random if and only if \KS(x)\ge |x|-c for some c and
infinitely many prefixes x of X.) Finally, we show that the low-basis
theorem can be used to get alternative proofs for these results and to improve
the result about effectively open sets; this stronger version implies the
2-randomness criterion mentioned in the previous sentence.
Game interpretation of Kolmogorov complexity
The Kolmogorov complexity function K can be relativized using any oracle A,
and most properties of K remain true for relativized versions. In section 1 we
provide an explanation for this observation by giving a game-theoretic
interpretation and showing that all "natural" properties are either true for
all sufficiently powerful oracles or false for all sufficiently powerful
oracles. This result is a simple consequence of Martin's determinacy theorem,
but its proof is instructive: it shows how one can prove statements about
Kolmogorov complexity by constructing a special game and a winning strategy in
this game. This technique is illustrated by several examples (total conditional
complexity, bijection complexity, randomness extraction, contrasting plain and
prefix complexities). Comment: 11 pages. Presented in 2009 at the conference on randomness in
Madison.
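The game technique itself can be shown in miniature: in any finite game, backward induction decides which player has a winning strategy, a trivial instance of the determinacy used above. The subtraction game below is a standard textbook example chosen for this sketch, not one of the games constructed in the paper:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def mover_wins(n: int) -> bool:
    """Subtraction game: a pile of n tokens, each move removes 1 or 2
    tokens, and taking the last token wins. Returns True iff the player
    to move has a winning strategy, computed by backward induction:
    a position is winning iff some move leads to a losing position."""
    return any(not mover_wins(n - k) for k in (1, 2) if n - k >= 0)

# The losing positions are exactly the multiples of 3.
print([n for n in range(12) if not mover_wins(n)])  # [0, 3, 6, 9]
```

Proofs in the game-theoretic style work the same way in spirit: one designs a game so that the desired statement about complexity follows from the existence of a winning strategy for one of the players.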
Universal convergence of semimeasures on individual random sequences
Solomonoff’s central result on induction is that the posterior of a universal semimeasure M converges rapidly and with probability 1 to the true sequence-generating posterior µ, if the latter is computable. Hence, M is eligible as a universal sequence predictor in case of unknown µ. Despite some nearby results and proofs in the literature, the stronger result of convergence for all (Martin-Löf) random sequences remained open. Such a convergence result would be particularly interesting and natural, since randomness can be defined in terms of M itself. We show that there are universal semimeasures M which do not converge for all random sequences, i.e. we give a partial negative answer to the open problem. We also provide a positive answer for some non-universal semimeasures. We define the incomputable measure D as a mixture over all computable measures and the enumerable semimeasure W as a mixture over all enumerable nearly-measures. We show that W converges to D and D to µ on all random sequences. The Hellinger distance measuring closeness of two distributions plays a central role.
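A finite toy version of such mixture convergence is easy to simulate. The environment class, biases, and sample size below are illustrative assumptions of this sketch (a Bayesian mixture over five Bernoulli measures, far simpler than a universal semimeasure); it shows the posterior weight concentrating on the true measure:

```python
import random

random.seed(0)

# Finite "environment class": Bernoulli measures with these biases.
thetas = [0.1, 0.3, 0.5, 0.7, 0.9]
true_theta = 0.7
weights = [1.0 / len(thetas)] * len(thetas)  # uniform prior

for _ in range(500):
    bit = 1 if random.random() < true_theta else 0
    # Bayes update: multiply each weight by the probability that
    # the corresponding environment assigns to the observed bit.
    weights = [w * (t if bit else 1 - t) for w, t in zip(weights, thetas)]
    s = sum(weights)
    weights = [w / s for w in weights]

# Posterior mass concentrates on the true environment.
print({t: round(w, 4) for t, w in zip(thetas, weights)})
```

The negative result of the paper says this clean picture can fail for some universal semimeasures on individual Martin-Löf random sequences; the finite computable setting above is exactly the regime where convergence is unproblematic.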
Conditional complexity and codes
Abstract: Let x and y be binary strings. We prove that there exists a program p of size about K(x|y) that maps y to x and has small complexity when x is known (K(p|x)≈0). Having in mind the parallelism between Shannon information theory and algorithmic information theory, one can say that this result is parallel to the Wolf–Slepian and Körner–Csiszár–Marton theorems; see (I. Csiszár and J. Körner, Information Theory: Coding Theorems for Discrete Memoryless Systems, Akadémiai Kiadó, Budapest, 1981). We also show that for any three strings x, y, z of length at most n, the length of the shortest program p that maps both y and z to x (i.e., p(y)=p(z)=x) equals max(K(x|y),K(x|z))+O(log n).
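Since K is uncomputable, results about conditional complexity are often illustrated with real compressors: the difference len(compress(y+x)) − len(compress(y)) is a crude computable upper-bound proxy for K(x|y), the same idea that underlies the normalized compression distance. The strings and the use of zlib in this sketch are illustrative assumptions, not part of the paper:

```python
import zlib

def cond_complexity_bits(x: bytes, y: bytes) -> int:
    """Heuristic upper bound on conditional complexity K(x|y):
    the extra compressed bits needed for x once y is available."""
    c_y = len(zlib.compress(y, 9))
    c_yx = len(zlib.compress(y + x, 9))
    return 8 * max(c_yx - c_y, 0)

x = b"the quick brown fox jumps over the lazy dog" * 4
y = x[:100]  # y shares a long prefix with x

# Knowing y should make x much cheaper to describe than knowing nothing.
print(cond_complexity_bits(x, y), cond_complexity_bits(x, b""))
```

As expected, the conditional estimate given the overlapping string y is far smaller than the unconditional one, mirroring the intuition behind K(x|y) that the theorems above make precise.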